
    Results and recommendations from an intercomparison of six Hygroscopicity-TDMA systems

    The performance of six custom-built Hygroscopicity-Tandem Differential Mobility Analyser (H-TDMA) systems was investigated in the frame of an international calibration and intercomparison workshop held in Leipzig, February 2006. The goal of the workshop was to harmonise H-TDMA measurements and develop recommendations for atmospheric measurements and their data evaluation. The H-TDMA systems were compared in terms of the sizing of dry particles, relative humidity (RH) uncertainty, and consistency in the determination of number fractions of different hygroscopic particle groups. The experiments were performed in an air-conditioned laboratory using ammonium sulphate particles or an external mixture of ammonium sulphate and soot particles. The sizing of dry particles by the six H-TDMA systems was within 0.2 to 4.2% of the selected particle diameter, depending on the investigated size and the individual system. Measurements of ammonium sulphate aerosol found deviations equivalent to 4.5% RH from the set point of 90% RH compared to results from previous experiments in the literature. The number fractions of particles within the clearly separated growth factor modes of a laboratory-generated externally mixed aerosol were also evaluated. The data from the H-TDMAs were analysed with a single fitting routine to investigate differences caused by the different data evaluation procedures used for each H-TDMA. The differences between the H-TDMAs were reduced from +12/-13% to +8/-6% when the same analysis routine was applied. We conclude that a common data evaluation procedure to determine number fractions of externally mixed aerosols will improve the comparability of H-TDMA measurements. It is recommended to ensure proper calibration of all flow, temperature and RH sensors in the systems. It is most important to thermally insulate the aerosol humidification unit and the second DMA and to monitor these temperatures to an accuracy of 0.2 degrees C. For the correct determination of external mixtures, it is necessary to take into account size-dependent losses due to diffusion in the plumbing between the DMAs and in the aerosol humidification unit.
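    The evaluation step described above, fitting clearly separated growth-factor modes and integrating them into number fractions, can be illustrated with a minimal sketch. The two-Gaussian model, the growth-factor grid and all numbers below are invented for illustration; this is not the workshop's actual fitting or inversion routine.

```python
# Minimal sketch (not the workshop's routine): fit two Gaussian modes to a
# hygroscopic growth-factor (GF) distribution and derive the number fraction
# of each externally mixed particle group. All data below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def two_modes(gf, n1, mu1, s1, n2, mu2, s2):
    """Sum of two Gaussian modes, e.g. soot-like (GF ~ 1.0) and sulphate-like (GF ~ 1.7)."""
    g = lambda n, mu, s: n * np.exp(-0.5 * ((gf - mu) / s) ** 2)
    return g(n1, mu1, s1) + g(n2, mu2, s2)

gf = np.linspace(0.8, 2.2, 60)                      # hypothetical GF bins at 90% RH
counts = two_modes(gf, 300, 1.0, 0.05, 700, 1.68, 0.08) + np.random.normal(0, 5, gf.size)

p0 = [200, 1.0, 0.05, 500, 1.7, 0.1]                # initial guesses for the two modes
popt, _ = curve_fit(two_modes, gf, counts, p0=p0)

# Integrate each fitted mode; the ratio gives the number fraction per hygroscopic group.
area = lambda n, s: n * s * np.sqrt(2 * np.pi)
a1, a2 = area(popt[0], popt[2]), area(popt[3], popt[5])
print(f"less-hygroscopic fraction: {a1 / (a1 + a2):.2f}")
print(f"more-hygroscopic fraction: {a2 / (a1 + a2):.2f}")
```

    Applying one such routine to every instrument's data, as the workshop did, removes the spread introduced by instrument-specific evaluation procedures.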

    CLOUD: an atmospheric research facility at CERN

    This report is the second of two addenda to the CLOUD proposal at CERN (physics/0104048), which aims to test experimentally the existence of a link between cosmic rays and cloud formation, and to understand the microphysical mechanism. The document places CLOUD in the framework of a CERN facility for atmospheric research, and provides further details on the particle beam requirements.

    A study of the link between cosmic rays and clouds with a cloud chamber at the CERN PS

    Recent satellite data have revealed a surprising correlation between galactic cosmic ray (GCR) intensity and the fraction of the Earth covered by clouds. If this correlation were to be established by a causal mechanism, it could provide a crucial step in understanding the long-sought mechanism connecting solar and climate variability. The Earth's climate seems to be remarkably sensitive to solar activity, but variations of the Sun's electromagnetic radiation appear to be too small to account for the observed climate variability. However, since the GCR intensity is strongly modulated by the solar wind, a GCR-cloud link may provide a sufficient amplifying mechanism. Moreover, if this connection were to be confirmed, it could have profound consequences for our understanding of the solar contributions to the current global warming. The CLOUD (Cosmics Leaving OUtdoor Droplets) project proposes to test experimentally the existence of a link between cosmic rays and cloud formation, and to understand the microphysical mechanism. CLOUD plans to perform detailed laboratory measurements in a particle beam at CERN, where all the parameters can be precisely controlled and measured. The beam will pass through an expansion cloud chamber and a reactor chamber where the atmosphere is to be duplicated by moist air charged with selected aerosols and trace condensable vapours. An array of external detectors and mass spectrometers will be used to analyse the physical and chemical characteristics of the aerosols and trace gases during beam exposure. Where beam effects are found, the experiment will seek to evaluate their significance in the atmosphere by incorporating them into aerosol and cloud models.

    Toward community standards and software for whole-cell modeling

    Whole-cell (WC) modeling is a promising tool for biological research, bioengineering, and medicine. However, substantial work remains to create accurate, comprehensive models of complex cells. Methods: We organized the 2015 Whole-Cell Modeling Summer School to teach WC modeling and evaluate the need for new WC modeling standards and software by recoding a recently published WC model in SBML. Results: Our analysis revealed several challenges to representing WC models using the current standards. Conclusion: We therefore propose several new WC modeling standards, software, and databases. Significance: We anticipate that these new standards and software will enable more comprehensive models.
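    For readers unfamiliar with SBML, the recoding exercise mentioned above boils down to expressing model species, compartments and reactions in the SBML exchange format and inspecting them with standard tooling. The sketch below uses the python-libsbml bindings; the file name is a placeholder, and this is not the summer school's codebase.

```python
# Minimal sketch of inspecting an SBML-encoded model with python-libsbml
# (pip install python-libsbml). The file name is hypothetical.
import libsbml

doc = libsbml.readSBMLFromFile("whole_cell_model.xml")   # placeholder path
if doc.getNumErrors() > 0:
    doc.printErrors()                                     # report validation problems

model = doc.getModel()
print("species:  ", model.getNumSpecies())
print("reactions:", model.getNumReactions())

# List a few species identifiers to check how model components were encoded.
for i in range(min(5, model.getNumSpecies())):
    s = model.getSpecies(i)
    print(s.getId(), s.getCompartment())
```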

    Successful implementation of complex projects with decentralised and centralised management schemes

    The thematic hypothesis of the paper is that both centralized and decentralized management schemes possess complementary features to induce successful project implementation. Centralized organizations are traditionally seen to promote control and coordination, while decentralized ones focus on autonomous units forming interactive and flexible networks ready to evolve over the execution of the project. The paper investigates two real-world cases, one from a centralized project and one from a decentralized project. Both projects deliver complex system products. The paper concludes that the top-level management processes of complex projects should follow a strict linear model of an organization focusing on execution and control, while at the lower level the organization should resemble more of a network organization, where autonomous project teams execute their tasks. Each autonomous unit must have strictly defined interfaces with the units that are immediately up- and downstream of the concerned unit. The level at which the decentralized organization should follow strict and predetermined procedures seems to be dictated by the level of modularity in the delivered system. Drawing from the limited cases, it seems that the more modular the end product is, the more autonomous the organization orchestrating the project can be.

    Towards Green Big Data at CERN

    High-energy physics studies collisions of particles traveling near the speed of light. For statistically significant results, physicists need to analyze a huge number of such events. One analysis job can take days and process tens of millions of collisions. Today the experiments of the Large Hadron Collider (LHC) create 10 GB of data per second, and a future upgrade will cause a ten-fold increase in data. The data analysis requires not only massive hardware but also a lot of electricity. In this article, we discuss energy efficiency in scientific computing and review a set of intermixed approaches we have developed in our Green Big Data project to improve the energy efficiency of CERN computing. These approaches include making energy consumption visible to developers and users, architectural improvements, smarter management of computing jobs, and the benefits of cloud technologies. The open and innovative environment at CERN is an excellent playground for different energy efficiency ideas which can later find use in mainstream computing.
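    One of the approaches listed above, making energy consumption visible to developers and users, can be illustrated with a small per-job measurement sketch. It reads the Linux RAPL energy counters exposed under /sys/class/powercap on Intel CPUs; this is a generic illustration of the idea, not the Green Big Data project's actual tooling, and the counters require suitable permissions.

```python
# Minimal sketch: report the CPU-package energy consumed while a command runs,
# using the Linux RAPL interface (Intel CPUs). Illustration only, not the
# Green Big Data tooling.
import subprocess
import sys
from pathlib import Path

RAPL = Path("/sys/class/powercap/intel-rapl:0")      # package-0 energy domain

def read_uj() -> int:
    """Return the cumulative package energy counter in microjoules."""
    return int((RAPL / "energy_uj").read_text())

def run_with_energy(cmd):
    """Run a command and print the package energy it consumed."""
    max_range = int((RAPL / "max_energy_range_uj").read_text())
    before = read_uj()
    subprocess.run(cmd, check=True)
    after = read_uj()
    used = (after - before) % max_range               # handle counter wrap-around
    print(f"{' '.join(cmd)}: ~{used / 1e6:.1f} J (package 0)")

if __name__ == "__main__":
    run_with_energy(sys.argv[1:] or ["sleep", "1"])
```

    Exposing such per-job numbers is what lets developers compare the energy cost of different implementations or job placements, which is the premise of the visibility approach described in the abstract.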